information à la source - definition. What is "information à la source"?
Diclib.com
What is "information à la source" - definition

Source information rate

The Source (Ingres)
PAINTING BY JEAN AUGUSTE DOMINIQUE INGRES
La Source Fountain
The Source is an oil painting on canvas by the French neoclassical painter Jean-Auguste-Dominique Ingres. The work was begun in Florence around 1820 and not completed until 1856, in Paris.
Information source (mathematics)         
SEQUENCE OF RANDOM MATHEMATICAL VARIABLES
Information source (Mathematics)
In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
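The definition above can be sketched numerically. A minimal example (the alphabet, probabilities, and helper name are illustrative assumptions, not from the source): an i.i.d. sequence of random variables over a finite alphabet Γ, whose fixed marginal distribution makes the source stationary.

```python
import random

# Hypothetical example values: a finite alphabet Γ and a fixed marginal
# distribution over it. Because every X_i is drawn from the same
# distribution, the resulting sequence is a stationary information source.
ALPHABET = ['a', 'b', 'c']
PROBS = [0.5, 0.3, 0.2]

def source(n, seed=0):
    """Emit n symbols X_1, ..., X_n drawn i.i.d. from the distribution."""
    rng = random.Random(seed)
    return [rng.choices(ALPHABET, weights=PROBS)[0] for _ in range(n)]

print(source(10, seed=42))
```

Any process with this shape — a stationary sequence of random variables over a finite alphabet — fits the definition; i.i.d. sampling is just the simplest case.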
À la suite
HONORIFIC OFFICER APPOINTMENT IN GERMAN ARMIES
General à la Suite; A la suite; À la Suite
À la suite ("in the entourage [of]") was a military title given to those who were allotted to the army or a particular unit for honour's sake, and who were entitled to wear a regimental uniform but otherwise had no official position (Brockhaus, Die Enzyklopädie in 24 Bänden, 1796–2001, Band 1, p. 316).

Wikipedia

Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process X_k divided by n, as n tends to infinity:

H(X) = lim_{n→∞} (1/n) H(X_1, X_2, …, X_n)

when the limit exists. An alternative, related quantity is:

H′(X) = lim_{n→∞} H(X_n | X_{n−1}, X_{n−2}, …, X_1)

For strongly stationary stochastic processes, H(X) = H′(X). The entropy rate can be thought of as a general property of stochastic sources; this is the asymptotic equipartition property. The entropy rate may be used to estimate the complexity of stochastic processes. It is used in diverse applications, ranging from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.
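As a concrete illustration of the conditional-entropy form H′(X): for a stationary Markov chain the conditioning collapses to a single previous symbol, giving the closed form H(X) = Σ_i π_i Σ_j −P_ij log₂ P_ij, where π is the stationary distribution of the transition matrix P. A small numeric sketch (the two-state chain and its transition probabilities are made-up values for illustration):

```python
from math import log2

# Hypothetical two-state transition matrix: P[i][j] = Pr(next=j | current=i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution pi solves pi = pi P; for two states it has the
# closed form pi_0 = P[1][0] / (P[0][1] + P[1][0]).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1 - pi0]

# Entropy rate in bits per symbol: the pi-weighted average of the
# per-row (conditional) entropies of the transition matrix.
entropy_rate = sum(pi[i] * sum(-P[i][j] * log2(P[i][j]) for j in range(2))
                   for i in range(2))
print(round(entropy_rate, 4))  # ≈ 0.569 bits per symbol
```

Note that the rate (≈ 0.569 bits) is below the 1 bit per symbol of a fair coin: the chain's persistence in state 0 makes successive symbols predictable, which is exactly the redundancy a data compressor can exploit.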